Market Roundup May 31, 2002
Intel Projects Itanium 2 Performance
Pew Internet Project Releases Internet Health Study
Carnivore Bites Off More than the FBI Can Chew
Enterprise Linux Players Introduce the UnitedLinux Initiative
Intel Projects Itanium 2 Performance

By Charles King

Intel has announced performance estimates for its
forthcoming Itanium 2 processor (codenamed McKinley) based on tests of
prototype systems. According to Intel, systems based on the new processor are
expected to deliver up to 1.5 to 2 times the performance of current
Itanium-based systems, and significantly better performance than comparable
Sun UltraSPARC III systems across a range of enterprise and technical
applications including ERP, databases, secure transaction processing,
scientific and technical computing, and MCAD. The Itanium 2 includes design
enhancements such as 3MB of on-die level 3 (L3) cache, additional
execution units and issue ports, and improved data speeds, including a threefold
increase in system bus bandwidth, 1GHz frequency, and improved cache
latencies. The Itanium architecture also maintains generation-to-generation
software compatibility, enabling applications compiled for the Itanium
processor to run on the Itanium 2 with significant performance increases. The
Itanium 2 is expected to become available in mid-2002.

Processor performance comparisons rest on a slippery
technical slope that we prefer to tread on carefully, since tests can be
skewed by chipset and server architectures, the applications used, and the
testers themselves, as well as a host of incremental influences including
humidity, barometric pressure, planetary alignment, caffeine consumption,
feng shui, and generic wishful thinking. Though performance projection pushes
the aforementioned slippery slope up a couple of degrees, Intel’s
announcement includes issues that are worthy of further discussion. The first
is strategic product positioning, which is revealed by the test use of Sun’s
UltraSPARC III as Itanium 2’s primary RISC platform/target for comparison.
This suggests two things. First, that Intel is fully cognizant of the areas
where its customers hope to engage and best Sun with Intel-based products,
and has focused its efforts on developing a solution that is up to the task.
Sun will likely scoff at this, since it considers Intel-based products to be
best used at the network edge, far from the data center and enterprise
backend where heavy lifting computing (such as Sun’s) is required. That
attitude may be considered simple arrogance or whistling past the graveyard on
Sun’s part. Over the past year, Sun has increasingly focused its attentions
on IBM as its primary competitor/nemesis. If the Itanium 2’s projected
performance holds up in real life applications, Sun may find itself up
against an able foe it has largely underrated.

Second, Intel’s willingness to compare (sometimes brutally) the Itanium 2 against first-generation Itanium chips suggests to us that the company is ready and willing to publicly face concerns about the Merced chip’s performance. Over the past few months, concerns have arisen over Intel’s ability to deliver effective high-end computing solutions. While the real-world capabilities of the Itanium 2 are as yet unclear, Intel’s projected performance data should give company partners such as Dell, IBM, Unisys, and Fujitsu a lot to smile about. The new HP should head up that list. Since the company has publicly committed to migrating three product platforms to Intel architecture, it stands to gain more from the Itanium 2’s enhancements than any other vendor.
News reports indicate that Microsoft conducted a
marketing survey with the registered owners of its Xbox game console, in
which it asked a number of questions about using the Xbox as a means to
access files stored on a PC in the home. According to the reports, the survey
asked if users would be interested in a $29 connection kit that would allow
for PC-stored files to be viewed or played through the Xbox console and a
television. Microsoft confirmed that it had conducted the survey but
apparently refused to confirm that a product is under development at this
time, even though the survey indicated the connection kit could be available
in a year.

Nice try on the denial, folks. Even if Microsoft
does not offer this kind of connection, some third party will try. It is
crystal clear Microsoft is hoping the Xbox is another means by which to
expand its market from the desktop into the living room. If it can find a way
to leverage the two, one suspects the company will attempt to do so. When one
considers the recent Microsoft announcements of a cut in the Xbox price and
the development of a broadband Internet gaming network that can be accessed
through the Xbox and a cable modem, the logical connection — both figurative
and literal — between the Xbox and the PC makes sense. Hell, it’s just plain
obvious.

It is also obvious that linking the Internet to all parts of the home — through a variety of devices other than the PC — is going to make online content and services all the more valuable and desirable. Music files, videos, and games would find themselves in search of new markets as the devices become interconnected throughout the home. Which makes us take a new look at the increasingly panicked efforts to rein in Internet piracy, especially with respect to entertainment content such as music and movies. Rapper Eminem’s latest release was moved up by weeks because of the extraordinarily high volume of pirated copies of the album available online. Bootleg copies of the latest Star Wars epic are also widely available. While the entertainment industry struggles to find ways to prevent such content piracy — efforts that we tend to see as largely doomed to failure — the vendors of new devices and software are making that content all the more valuable and portable. Thus, entertainment industry efforts to block the transfer of digital content are swimming upstream against a river of new technologies that ironically could make that very content worth more than it was yesterday. For the music and movie industry, it’s time to figure out how to get in the boat once and for all or face becoming washed up downstream.
Pew Internet Project Releases Internet Health Study

In its latest national survey, conducted during March 2002, the Pew Internet Project found that 62% of Internet users, or 73 million people in the United States, have gone online in search of health or medical information.

Searching for medical information on the Internet is
hardly new; in fact, one of the highest profile early dotcom ventures was
DrKoop.com. Unfortunately for investors, this site eventually flew its own
coop financially. Millions flocked to the Internet in the quest for
information and guidance on sundry matters, with health being one of the most
popular. But as large as the figure of 73 million people might sound, we need
to remember that this represents about one-quarter of the 287+ million living
in the United States.

In an era when a significant number of people lack health insurance, a $100+ visit to a doctor can be easily rationalized as unnecessary if one believes effective answers at minimal cost reside on the Internet. The danger of this notion, of course, is that practicing medicine on oneself is not at all objective and can have serious, even fatal consequences. So while many might hail the emergence of the Internet as a medical resource for growing numbers of Americans, who will note that half as many citizens lack basic health insurance and therefore access to preventative medicine?

There are discussions aplenty about the digital divide, but in some common sense moments, one can reflect on the reality that while our society lauds the success of having 25% of its population use the Internet to access medical information, it seems all too often blind to the 13% of its population that lacks cost-effective access to health care. In the greater scheme of things, the digital divide just seems trivial by comparison.
Carnivore Bites Off More than the
FBI Can Chew

The FBI announced this week — among a number of
organizational changes — that the agency’s Carnivore email monitoring system
was not working properly in March 2000. Not only was the system switched off,
but all of the data it gathered in the form of intercepted emails was erased.
Some of that data included emails that were the target of the Carnivore
system, including emails pertaining to Osama Bin Laden. The FBI said that a
technician turned off the system and deleted the data it had captured after
realizing that the system was collecting information that was not authorized
and that this might “contaminate ongoing investigations.” Carnivore works by
monitoring and capturing data packets as they flow through ISPs, and has been
widely criticized by civil libertarians who argue that such a tool invades
innocent citizens’ privacy. News reports said revelations that Carnivore was
not functioning properly were dismissed by FBI officials, who apparently
claimed the mistake was unusual and that its source was an ISP, not the FBI.
Meanwhile, the FBI is on the verge of receiving broader powers to monitor
domestic communications in an attempt to strengthen its hand against ongoing
terrorist threats. Additionally, as part of its overhaul, FBI officials also
announced they were actively recruiting more technology-savvy agents in an
attempt to combat terrorism with Internet-related technologies.

The FBI — desperate to find new ways to head off the
“inevitable” terrorist attacks of the future — has apparently decided that
the Internet and a presence therein will be useful targets for allocation of
resources. But this week’s revelations about Carnivore’s operational
limitations — and the agency’s stated desire to beef up its Internet presence
— rings oddly familiar to us. Private sector enterprises have been struggling
for years, with improved degrees of success, to allocate the proper amount of
resources to Internet ventures and technology. They now seem to be getting
the hang of it. Perhaps not so with the FBI, however. As the Carnivore
episode clearly shows, technology in itself is not an end-all and be-all. It
has to be applied in a fashion that meets finite and discernible goals. If it
is allowed to stray, as the Carnivore system apparently did, it becomes a
liability, not an asset.

Those opposed to Carnivore certainly have reason to feel vindicated. Historically, the FBI has exceeded its authority and conducted wide-ranging investigations of American citizens who in fact posed no threat to the nation or its security. Just because the agency has the capacity to read anyone’s email doesn’t mean it should have the right to do so, the argument goes. To make that argument stick, however, we foresee the need for much broader oversight. New technology has ripple effects throughout a society; the effects of Carnivore and similar technologies will force the agency and its overseers to mount much more sustained and real-time efforts to ensure that the FBI stays within its mandate. The effectiveness of this oversight — be it from the courts or the Congress — will be a direct result of these entities’ familiarity with the nuances of IT and its repercussions. So while we find the FBI’s rush to become more cyber-savvy laudable, we believe that it will not be a truly balanced approach until both the courts and Congress can observe and judge the appropriateness of FBI technology initiatives with a full and complete understanding of what is at hand and at stake.
Enterprise Linux Players Introduce the UnitedLinux Initiative

Caldera, Conectiva S.A., SuSE Linux, and Turbolinux
announced UnitedLinux, an initiative the companies claim will streamline
Linux development and certification. Under the terms of the agreement, the
four companies will collaborate on the development of UnitedLinux software, a
common core Linux operating environment that is designed for business and
certified to work across enterprise hardware and software platforms. The four
partners will each bundle value-added products and services with the
UnitedLinux OS, and sell the resulting offering under their own brands.
Who is missing from UnitedLinux (Red Hat) says as
much about the project as who is involved. Since deciding a year or so ago to
evolve from being a pure ISV into a Linux consultant and service provider,
Red Hat has ridden the wake of the enterprise Linux wave behind formidable
buddies like IBM and HP. That has translated into good business and a growing
footprint for Red Hat, but the situation understandably irks Caldera et al.,
whose own Linux business solutions have been taking on water. UnitedLinux can
be rightly regarded as a way these smaller Linux players are trying to find
some safety (and market traction) in numbers. From the looks of their party
list, which contains most every Linux-friendly vendor on the planet, they
might even have a chance to succeed. Though to be honest, getting a pat on
the back from said vendors makes the patters look supportive to partners and
customers alike without guaranteeing a thing to the pattee.

Guarantees aside, we wonder if UnitedLinux portends a new, possibly darker direction for Linux development. For a project that began as an unlikely mix of high-tech sophistication and communal sensibilities, Linux has remained remarkably true to its open source roots, but some cracks are beginning to show. The past year has seen sniping within the community of Linux volunteers over divisions of labor and responsibility. In fact, questions have arisen over whether Linux has become too complex to manage or even maintain on a volunteer basis. At the same time, with the implicit and enthusiastic assistance of vendors like IBM, HP, and lately Sun, the presence of Linux in enterprises continues to grow, adding further complexity to an operating environment that already seems bedeviled by complexity. The decision of the UnitedLinux partners to develop a Linux operating environment “designed for business” has an oddly familiar ring to it. It was not so long ago that UNIX appeared on the scene as a free, open operating system that was eventually co-opted for commercial, proprietary use by hardware vendors from IBM to HP to Sun. We would hate to be accused of mere cynicism, but we wonder if the emergence of the new business-friendly UnitedLinux platform may eventually serve as a means to privatize open source solutions.